Table of Contents

Scientia Iranica
Volume: 29, Issue: 6, Nov-Dec 2022

  • Publication date: 1401/09/05
  • Number of titles: 11
  • M. Shojaee, S. Jafarian-Namin *, S.M.T. Fatemi Ghomi, D. M. Imani, A. Faraz, M. S. Fallahnezhad Pages 3369-3387
    One of the main criteria for judging the power of control charts is their ability to detect deviations and shifts in the process quickly. The average time to signal (ATS) and the adjusted average time to signal (AATS) are among such criteria, calculated under a given state and set of assumptions. Several studies have shown that applying variable designs to control charts, by separating their limits into safe and warning regions, allows faster detection of shifts and increases sensitivity to small changes. In this paper, a new variable sampling scheme with three sample sizes and two different sampling intervals, called SVSSI, is developed to increase the efficiency of the np control chart. Through various numerical examples, the performance of this scheme is evaluated by calculating ATS and AATS values using the Markov chain method. Monte Carlo simulation is used to validate the Markov chain results for the SVSSI sampling scheme. In comparison with other schemes, SVSSI is shown to perform better in all conditions.
    Keywords: SVSSI scheme, np control chart, Average time to signal, Adjusted average time to signal, Markov chain
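    As background for the ATS criterion the abstract mentions, a fixed-parameter np chart (not the SVSSI scheme itself) illustrates how ATS is computed: with a constant sampling interval h, ATS is h times the average run length. The sample size, fractions nonconforming, and interval below are illustrative assumptions, not values from the paper.

```python
import math

def binom_pmf(k, n, p):
    # Binomial probability mass function.
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def np_chart_ats(n, p0, p1, h):
    """ATS of a fixed-parameter np chart: sampling interval h times
    the average run length under shifted fraction nonconforming p1."""
    center = n * p0
    sigma = math.sqrt(n * p0 * (1 - p0))
    lcl, ucl = center - 3 * sigma, center + 3 * sigma
    # Probability that a sample count stays inside the limits under p1.
    beta = sum(binom_pmf(k, n, p1) for k in range(n + 1) if lcl <= k <= ucl)
    return h / (1.0 - beta)  # ATS = h * ARL, with ARL = 1 / (1 - beta)
```

    For example, `np_chart_ats(100, 0.05, 0.10, 1.0)` gives the expected hours until the chart signals a doubling of the nonconforming fraction; variable schemes such as SVSSI aim to shrink this value.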
  • J. F. Munoz, P. J. Moya-Fernandez *, E. Alvarez, F. J. Blanco-Encomienda Pages 3388-3393
    The constant c4[n] is commonly used in the construction of control charts and the estimation of process capability indices, where n denotes the sample size. Assuming a normal distribution, an unbiased estimator of the population standard deviation is obtained by dividing the sample standard deviation by the constant c4[n]. An alternative expression for c4[n] is proposed, and mathematical induction is used to prove its validity. The suggested expression has several desirable properties. First, it provides the exact value of c4[n]. Second, it is not a recursive formula, in the sense that it does not depend on the value for the previous sample size. Finally, the value of c4[n] can be computed directly for large sample sizes. These properties suggest that the proposed expression is a convenient solution in computer programming, with direct applications in statistical quality control.
    Keywords: Bias, standard deviation, gamma function, Control chart, Process capability index
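    The classical closed form c4(n) = sqrt(2/(n-1)) · Γ(n/2) / Γ((n-1)/2) is standard; the paper's alternative expression is not reproduced here. A sketch of computing c4(n) directly, even for very large n, via log-gamma:

```python
import math

def c4(n):
    # Classical closed form: c4(n) = sqrt(2/(n-1)) * Gamma(n/2) / Gamma((n-1)/2).
    # Using log-gamma keeps the ratio finite even for very large n,
    # where the Gamma function itself would overflow.
    return math.sqrt(2.0 / (n - 1)) * math.exp(
        math.lgamma(n / 2.0) - math.lgamma((n - 1) / 2.0))
```

    This matches the classical control chart tables (c4(2) ≈ 0.7979, c4(5) ≈ 0.9400) and evaluates without overflow for sample sizes in the millions.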
  • Z. Khalaj, A. Aghaie *, Y. Samimi Pages 3394-3403
    With rapid advancements in technology, studying and simulating a complex system with uncertain parameters is highly demanding. Based on the literature, there are three approaches to identifying and simulating systems: engineering, statistical, and engineering-statistical. The purpose of this study is to apply the engineering-statistical approach to the calibration and adjustment of a Laser-Assisted Micro-Machining (LAMM) process with two correlated outputs, known as the cutting and thrust forces. This paper contributes to the existing literature by extending the most relevant approach for calibrating single-output complicated processes to multi-output settings, where the discrepancy function is modeled by a multivariate Gaussian process and multivariate analysis of variance is used to identify the variables whose adjustment benefits the most. For the best case reported in previous studies, the Mean Squared Prediction Error (MSPE), used as the comparison index, was around 1.48 for the thrust force, whereas the proposed approach resulted in a better value of 1.9425×10^-4. Moreover, for the cutting-force output, the index was 0.21 for the Kennedy and O'Hagan model, 1.41 for the Roshan and Yan model, and 1.6×10^-8 for the presented model. These values demonstrate reasonable and comparable MSPE results in comparison with models that consider the outputs individually.
    Keywords: Uncertainty Quantification, Adjustment, Calibration, Engineering-Statistical Model, Gaussian Process, Laser-Assisted Micromachining Process
  • V. K. Chawla *, A. K. Chanda, S. Angra, A. Bonyadi Pages 3404-3417
    The use of material handling robots (MHRs) for efficient material handling operations in flexible manufacturing systems (FMS) has gained wide popularity and acceptance across automated production industries. Coexistent scheduling of jobs and MHRs significantly improves the overall efficiency of the FMS. In the present study, coexistent scheduling of the MHRs and the jobs under production in the FMS is carried out using an advanced grey wolf optimization (AGWO) algorithm. The proposed FMS layout is made up of tandem flow path configurations for the movements of the MHRs. The FMS comprises six flexible manufacturing cells (FMCs) partitioned into six zones, with an MHR deployed in each partitioned zone for efficient material handling operations. To develop the coexistent schedule between MHRs and jobs, a combined objective function is formulated from the two diverging objectives and solved using the AGWO algorithm. The combined objective function values for coexistent production scheduling in an FMS operating with nineteen work centers (WC) and six MHRs, producing thirty-six and sixty-six job types in varying batch production quantities, are also reported in the paper.
    Keywords: Advanced grey wolf optimization, Coexistent scheduling, Flexible manufacturing system, Material handling robots, Tandem flow path configurations
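    The abstract does not describe the paper's advanced variant (AGWO), but the baseline grey wolf optimizer from the standard literature can be sketched: the three best wolves (alpha, beta, delta) guide the rest, with an exploration coefficient decreasing linearly over iterations. The sphere test function and all parameter values are illustrative assumptions, not the paper's scheduling objective.

```python
import random

def gwo_minimize(f, dim, n_wolves=20, iters=200, lb=-10.0, ub=10.0, seed=1):
    """Baseline grey wolf optimizer (not the paper's AGWO variant)."""
    rng = random.Random(seed)
    wolves = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(n_wolves)]
    for t in range(iters):
        wolves.sort(key=f)
        # Copy the three leaders so position updates do not shift them mid-pass.
        alpha, beta, delta = (w[:] for w in wolves[:3])
        a = 2.0 - 2.0 * t / iters  # exploration coefficient: 2 -> 0
        for w in wolves:
            for d in range(dim):
                x = 0.0
                for leader in (alpha, beta, delta):
                    r1, r2 = rng.random(), rng.random()
                    A = 2 * a * r1 - a
                    C = 2 * r2
                    D = abs(C * leader[d] - w[d])
                    x += leader[d] - A * D
                # New position: average of the three leader-guided moves.
                w[d] = min(ub, max(lb, x / 3.0))
    wolves.sort(key=f)
    return wolves[0], f(wolves[0])

sphere = lambda v: sum(x * x for x in v)
best, fbest = gwo_minimize(sphere, dim=5)
```

    On this toy function the pack converges toward the origin; the paper replaces the test function with a combined scheduling objective over job and MHR assignments.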
  • S. H. Mousavipour, H. Farughi *, F. Ahmadizar Pages 3418-3433
    This paper introduces a novel bi-objective model for the Job Shop Scheduling Problem (JSSP) that minimizes makespan and maximum tardiness simultaneously. Some realistic assumptions, i.e., fuzzy processing times and due dates with triangular possibility distributions, transportation times, availability constraints, modified position-based learning effects on processing times, and sum-of-processing-time-based learning effects on the duration of maintenance activities, are considered to provide a more general and practical model for the JSSP. Under learning effects, processing times decrease as a machine performs an operation repeatedly and workers gain skills and experience. In this paper, a novel, modified formulation based on DeJong's learning effect is proposed. According to the above assumptions, a novel mixed-integer linear programming (MILP) model for the JSSP is suggested. Since the proposed model is a possibilistic program, it is first converted into an auxiliary crisp model; it is then solved by the TH and ε-constraint methods for small instances, and the results are compared. For medium and large instances, five metaheuristic algorithms, namely NSGA-III, PESA-II, SPEA-II, NSGA-II, and MOEA/D, are utilized, and the results are compared on the basis of three performance metrics.
    Keywords: job shop scheduling problem, learning effects, availability constraints, transportation times
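    DeJong's classical learning effect, on which the abstract says the modified formulation builds, scales a basic processing time by M + (1-M)·r^a, where M is the incompressibility factor, r the job's position, and a = log2(learning rate) ≤ 0 the learning index. The paper's modified formulation is not reproduced here, and the values of M and the learning rate below are illustrative.

```python
import math

def dejong_time(p, r, M=0.25, learning_rate=0.8):
    # Classical DeJong effect: actual time of a job processed in position r.
    # M is the incompressibility factor (the time never drops below M * p);
    # a = log2(learning_rate) <= 0 is the learning index.
    a = math.log2(learning_rate)
    return p * (M + (1 - M) * r ** a)

# A basic time of 10 shrinks with position but stays above the floor 2.5.
times = [dejong_time(10.0, r) for r in range(1, 6)]
```

    The incompressibility factor is what distinguishes DeJong's model from the plain log-linear learning curve: no amount of repetition compresses the time below M·p.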
  • M. Jamshidi, M. Sanei *, A. Mahmoodirad Pages 3434-3454
    Data envelopment analysis (DEA) can be employed to investigate the operation and evaluation of units, one of the most important concerns of managers. DEA is a linear programming technique for calculating the relative performance of decision-making units (DMUs) with multiple inputs and outputs. Although all inputs and outputs are treated as certain in these models, uncertain items exist in the real world, and the interaction between these two concepts results in uncertain models. In this study, allocation models were examined in an uncertain environment with belief-degree-based uncertain input costs and output prices. Belief-degree-based uncertainty is useful for cases where there is no historical information on an uncertain event. Utilizing the uncertain entropy model as a second objective function, the cost and revenue models showed optimal performance with a maximum dispersion rate in their constituent components. As a solution methodology, the uncertain allocation models were separately converted into crisp models by the expected value (EV) and the expected value and chance-constrained (EVCC) methods. A practical example from the Iranian stock market was also presented to evaluate the performance of the new model.
    Keywords: Data envelopment analysis, Uncertainty theory, Belief degree, Stock market
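    For intuition about the relative-performance idea behind DEA: in the single-input, single-output constant-returns case, the CCR efficiency score reduces to each unit's output/input ratio divided by the best ratio. The general multi-input, multi-output model, and the paper's belief-degree uncertain extension, require solving a linear program per DMU and are not reproduced in this sketch; the data below are illustrative.

```python
def ccr_efficiency(inputs, outputs):
    """CRS (CCR) efficiency scores for the single-input, single-output
    case, where the envelopment LP collapses to a ratio comparison.
    Multi-input / multi-output DEA needs an LP solver instead."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    # Each DMU is scored against the best observed productivity.
    return [r / best for r in ratios]

# Three hypothetical DMUs with inputs 2, 4, 8 and outputs 2, 3, 4.
scores = ccr_efficiency([2, 4, 8], [2, 3, 4])
```

    Here the first DMU defines the efficient frontier (score 1.0) and the others are scored by how far they fall below it.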
  • A. Eydi *, H. Mohagheghi, S. A. Ghasemi-Nezhad Pages 3455-3469
    Due to the importance of routing order pickers, there has been extensive research in the area of warehouse routing. Still, some prominent factors deserve more attention, as ignoring them may lead to unsatisfactory service and considerable operational costs. In real-world applications, the warehouse configuration, the width of the aisles, and the control of vehicle congestion in the aisles greatly influence the efficiency of the routing process. This paper therefore proposes a mixed-integer programming model that minimizes the maximum delivery time by finding the shortest pickup and delivery routes of all goods for all vehicles. Since the problem is NP-hard, a Simulated Annealing metaheuristic is designed to solve the model for large instances. This research contributes to the picker-routing literature by considering dynamic congestion, narrow and wide aisles, and pickup times, and by proposing a metaheuristic algorithm. The validity and efficiency of the proposed model are demonstrated by solving various generated benchmark problems. In summary, the developed route-planning model works effectively for any two-dimensional rectangular layout, and collision-prevention constraints are incorporated into the mathematical model.
    Keywords: picker routing in warehouse, aisle width, Congestion, Simulated annealing
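    Simulated annealing for a pick route can be sketched generically: swap two stops, always accept improvements, accept deteriorations with probability exp(-Δ/T), and cool T geometrically. This is a minimal sketch under illustrative points and parameters, not the paper's congestion-aware, aisle-width-aware algorithm.

```python
import math, random

def route_length(points, order):
    # Total length of an open route visiting points in the given order.
    return sum(math.dist(points[order[i]], points[order[i + 1]])
               for i in range(len(order) - 1))

def sa_route(points, iters=20000, t0=1.0, cooling=0.9995, seed=7):
    """Generic simulated annealing over pick orders (swap neighborhood)."""
    rng = random.Random(seed)
    order = list(range(len(points)))
    best, best_len = order[:], route_length(points, order)
    cur_len, t = best_len, t0
    for _ in range(iters):
        i, j = rng.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]
        new_len = route_length(points, order)
        # Accept improvements always; accept worse routes with
        # probability exp(-(new - cur) / t), which shrinks as t cools.
        if new_len < cur_len or rng.random() < math.exp((cur_len - new_len) / t):
            cur_len = new_len
            if cur_len < best_len:
                best, best_len = order[:], cur_len
        else:
            order[i], order[j] = order[j], order[i]  # undo the swap
        t *= cooling
    return best, best_len

# Illustrative pick locations along a single cross-aisle.
points = [(0, 0), (3, 0), (1, 0), (2, 0), (4, 0)]
best, best_len = sa_route(points)
```

    The paper's model additionally penalizes routes that put two vehicles in the same narrow aisle at the same time; the acceptance rule itself is unchanged.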
  • N. Monazzam, A. Alinezhad *, M. A. Adibi Pages 3470-3488
    A Production System (PS) is the process of planning, organizing, directing, and controlling the tactical and strategic planning of the different components of a company to transform inputs into finished products, and it must be effectively managed. The present study aimed at increasing efficiency and determining useful methods to evaluate and optimize performance in different parts of the PS. To this end, an integrated framework of Discrete-Event Simulation (DES), Design of Experiments (DOE), Data Envelopment Analysis (DEA), and Multi-Attribute Decision-Making (MADM) models was implemented to analyze and optimize a real PS process. In a case study of the automobile manufacturing industry in Iran, the proposed approach was analyzed in detail and its different aspects were considered. The results indicate that, compared to previous models, the proposed approach is a practical way to evaluate and optimize the performance of different parts of a PS and helps manufacturing companies make efficient decisions that increase productivity while reducing key problems.
    Keywords: Production System, Simulation, DOE, DEA, SWARA
  • M. Moradi *, M. Modarres, M. M. Sepehri Pages 3489-3504
    Prescribing and consuming more drugs than necessary is considered polypharmacy, which is both wasteful and harmful. The purpose of this paper is to establish an innovative data mining framework for analyzing physicians' prescriptions with regard to polypharmacy. The approach consists of three main steps: pre-modeling, modeling, and post-modeling. In the first step, after collecting and cleaning the raw data, several novel physician features are extracted. In the modeling step, two popular decision trees, i.e., C4.5 and the Classification and Regression Tree (CART), are applied to generate a set of if-then rules in a tree-shaped structure that detect and describe the physician features associated with polypharmacy. In a novel approach, the response surface method (RSM) is applied as a hyper-parameter tuning tool, together with correlation-based feature selection (CFS), to enhance the performance of the algorithms. In the post-modeling step, the discovered knowledge is visualized to make the results more perceptible and is then presented to domain experts to evaluate whether it makes sense. The framework has been applied to a real-world dataset of prescriptions. The results have been confirmed by the experts, which demonstrates the capabilities of the data mining framework in the detection and analysis of polypharmacy.
    Keywords: Decision Tree, CART, C4.5, Parameter tuning, response surface method (RSM), rational use of drugs
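    As background on the two tree algorithms named in the abstract: CART grows binary splits that minimize weighted Gini impurity (C4.5 uses the information gain ratio instead). A minimal split search, with a hypothetical "drug items per prescription" feature and labels, might look like:

```python
def gini(labels):
    # Gini impurity used by CART: 1 - sum over classes of p_k squared.
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(values, labels):
    """Exhaustive search for the binary threshold minimizing the
    weighted Gini impurity of the two children, as in CART."""
    best = (None, float("inf"))
    for thr in sorted(set(values))[:-1]:
        left = [l for v, l in zip(values, labels) if v <= thr]
        right = [l for v, l in zip(values, labels) if v > thr]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best[1]:
            best = (thr, score)
    return best

# Hypothetical data: item counts per prescription, 1 = flagged polypharmacy.
thr, score = best_split([2, 3, 4, 8, 9, 10], [0, 0, 0, 1, 1, 1])
```

    On this separable toy data the search finds the threshold that splits the classes perfectly (weighted impurity 0); real prescription features would yield an imperfect but informative split, and the tree recurses on each child.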
  • A. Mansouri, A. Alam-Tabriz * Pages 3505-3522
    The key issue in this study is the integration of redundancy allocation with the optimization of failure rates. Working only with parallel allocation, i.e., increasing the number of parallel components in a subsystem, is necessary to improve the reliability or availability of parallel-series systems, but it is not sufficient. Accordingly, this research also studies the improvement of the failure rates of the different components in the system. By carefully studying the effects of each of these approaches and the costs they impose on the system, the design-problem data are formed. Since greater effort to improve component reliability leads to less redundancy allocation, and vice versa, the optimization problem determines the exact number of redundancies along with the exact amount of improvement in the component failure rates. The structure of the studied system, a satellite attitude determination and control system, and its components are introduced; the reliability of this system is then modeled and optimized with a mathematical approach based on the combination of reliability allocation and redundancy allocation.
    Keywords: Reliability, Satellite, attitude determination, control system, Redundancy allocation
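    The parallel-series structure the abstract refers to follows from two standard identities: a parallel subsystem fails only if all its components fail, and a series system works only if all its subsystems work; with a constant failure rate λ, component reliability is R(t) = exp(-λt). The component values below are illustrative, not from the satellite case study.

```python
import math

def reliability_from_rate(lam, t):
    # Constant failure rate lam gives component reliability R(t) = exp(-lam * t),
    # which is why lowering a failure rate is an alternative to adding redundancy.
    return math.exp(-lam * t)

def parallel_reliability(rels):
    # A parallel subsystem fails only if every component in it fails.
    prod = 1.0
    for r in rels:
        prod *= (1.0 - r)
    return 1.0 - prod

def series_system_reliability(subsystems):
    # A series system of parallel subsystems works only if every subsystem works.
    total = 1.0
    for rels in subsystems:
        total *= parallel_reliability(rels)
    return total

# Illustrative trade-off: a duplicated 0.9 subsystem in series with a single 0.95 one.
system_rel = series_system_reliability([[0.9, 0.9], [0.95]])
```

    Duplicating the 0.9 component lifts that subsystem to 0.99; improving its failure rate could achieve the same at a different cost, which is exactly the trade-off the paper optimizes.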
  • M. Sharifi *, M. Shahriyari, A. Khajehpoor, S. A. Mirtaheri Pages 3523-3541
    In this research, a new hybrid model for the redundancy allocation problem (RAP) in a series-parallel configuration with k-out-of-n subsystems is presented. In the given model, the redundancy policy of each subsystem is set to active, warm standby, or no redundancy. Under the warm-standby policy, an imperfect switch detects a component's failure and replaces the failed component with a new standby one. The subsystems' redundancy policies are thus among the model's decision variables. We present a new objective function for the RAP that calculates the reliability of a system consisting of active and warm-standby subsystems. The presented model determines each subsystem's redundancy policy and the type and number of redundant components so as to maximize the system's reliability under the system's cost, volume, and weight constraints. To solve the proposed model, we used two meta-heuristic algorithms: a Genetic Algorithm (GA) and a hybrid GA (HGA) with local search. Since the %RPD of the HGA is, on average, 2.1% better than that of the GA in solving ten large-scale instances, the results show the superiority of the HGA over the GA for solving the presented RAP.
    Keywords: Redundancy allocation problem, warm standby, Reliability, Meta-heuristic methods
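    For the k-out-of-n subsystems mentioned in the abstract, the standard active-redundancy reliability with identical, independent components is a binomial tail sum; the warm-standby dynamics and the imperfect switch in the paper's model are not captured by this sketch.

```python
import math

def k_out_of_n_reliability(k, n, r):
    # Active k-out-of-n subsystem with identical independent components of
    # reliability r: it works if at least k of the n components survive.
    return sum(math.comb(n, i) * r**i * (1 - r)**(n - i)
               for i in range(k, n + 1))

rel = k_out_of_n_reliability(2, 3, 0.9)  # 2-out-of-3 with r = 0.9
```

    The formula recovers the familiar special cases: k = n gives a pure series structure (r^n) and k = 1 a pure parallel one.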